Easy2Siksha.com
o The variance of errors is constant across all values of X.
o If variance increases with X, results may be unreliable.
4. Normality of Errors:
o The errors are normally distributed; this is especially important for hypothesis testing and confidence intervals.
5. No Perfect Multicollinearity:
o In simple regression, only one predictor is used, so this assumption is
naturally satisfied.
6. Exogeneity:
o The independent variable X is not correlated with the error term.
o If violated, estimates become biased.
Example to Illustrate
Suppose we study the effect of hours studied (X) on exam score (Y).
• Collect data from 10 students.
• Use least squares to estimate β₀ (intercept) and β₁ (slope).
• If β₁ = 5, it means each extra hour of study increases the score by 5 marks.
• If β₀ = 20, it means a student who studies 0 hours is expected to score 20 marks.
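The example above can be checked numerically. The sketch below uses hypothetical data for 10 students, generated so that the true line is Y = 20 + 5X, and computes the least-squares coefficients from the standard formulas:

```python
# Least-squares fit for the hours-studied example.
# The data are hypothetical, chosen so the scores lie exactly on Y = 20 + 5X.
hours = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]        # X: hours studied
scores = [20 + 5 * x for x in hours]          # Y: exam scores

n = len(hours)
x_bar = sum(hours) / n
y_bar = sum(scores) / n

# beta1 = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
beta1 = (sum((x - x_bar) * (y - y_bar) for x, y in zip(hours, scores))
         / sum((x - x_bar) ** 2 for x in hours))
beta0 = y_bar - beta1 * x_bar                 # intercept

print(beta0, beta1)  # 20.0 5.0
```

Because the synthetic scores sit exactly on the line, the fit recovers the slope 5 and intercept 20 exactly; with real, noisy data the estimates would only approximate them.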
Conclusion
The least squares method derives regression coefficients by minimizing the sum of squared
errors, leading to formulas for β₀ and β₁. The model rests on assumptions like linearity,
independence, homoscedasticity, and normality.
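For reference, the estimated coefficients mentioned above are given by the standard least-squares formulas:

$$\hat\beta_1 = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2}, \qquad \hat\beta_0 = \bar{y} - \hat\beta_1\,\bar{x}$$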
SECTION – B
3. State and prove the Gauss–Markov theorem for a general linear regression model.
Ans: The Gauss–Markov Theorem (General Linear Regression Model)
1. The General Linear Regression Model
First, we need to understand the setting in which the theorem works.
A general linear regression model can be written in matrix form as:
Y = Xβ + ε
Let’s decode this in simple terms:
• Y → vector of observed dependent variable values
• X → matrix of independent variables (predictors)